Implementing a new method for discriminant analysis when group covariance matrices are nearly singular
Abstract
We consider a unified description of classification rules for nearly singular covariance matrices. When the covariance matrices of the groups or the pooled covariance matrix become nearly singular, Bayesian classification rules become seriously unstable. Several procedures have been proposed to tackle this problem, e.g. SIMCA and Regularized Discriminant Analysis (RDA). Næs and Indahl (1998) identified common properties of all of these procedures and proposed a unified classifier that incorporates the functionality of them all. Since the unified approach needs many parameters, they also proposed an alternative classifier with fewer parameters. We implemented both classifiers and compared them in a simulation study to the procedures RDA, LDA, and QDA. To enhance the comparability of our results, we based the simulation study on the study of Friedman (1989). In the implementation, we used a combination of the Nelder-Mead simplex algorithm and simulated annealing (Bohachevsky et al. (1986)) to optimize the classification error directly.
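To make the regularization idea concrete: Friedman's (1989) RDA, one of the baselines named above, stabilizes the group covariance estimates by blending each class scatter matrix with the pooled scatter (parameter lambda) and then shrinking toward a scaled identity matrix (parameter gamma). The Python sketch below is our own illustration of that two-parameter estimator, not the authors' implementation; the function name and the list-of-arrays interface are assumptions.

    import numpy as np

    def rda_covariances(X_by_class, lam, gam):
        # Friedman-style regularized covariance estimates (illustrative sketch).
        # X_by_class: list of (n_k, p) arrays, one per class
        # lam: 0 -> separate class covariances (QDA-like), 1 -> pooled (LDA-like)
        # gam: shrinkage toward a scaled identity, guarding against singularity
        p = X_by_class[0].shape[1]
        scatters, counts = [], []
        for X in X_by_class:
            Xc = X - X.mean(axis=0)
            scatters.append(Xc.T @ Xc)        # class scatter about its mean
            counts.append(X.shape[0])
        W_pooled, n = sum(scatters), sum(counts)

        covariances = []
        for W_k, n_k in zip(scatters, counts):
            # Blend the class scatter with the pooled scatter, then rescale
            S = ((1 - lam) * W_k + lam * W_pooled) / ((1 - lam) * n_k + lam * n)
            # Shrink toward (trace(S)/p) * I so the estimate stays invertible
            S = (1 - gam) * S + gam * (np.trace(S) / p) * np.eye(p)
            covariances.append(S)
        return covariances

In the spirit of the abstract, the pair (lam, gam) could then be chosen by minimizing an estimate of the classification error directly with a derivative-free optimizer such as Nelder-Mead.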
Similar articles
Visual Analysis of the Use of Mixture Covariance Matrices in Face Recognition
The quadratic discriminant (QD) classifier has proved to be simple and effective in many pattern recognition problems. However, it requires the computation of the inverse of the sample group covariance matrix. In many biometric problems, such as face recognition, the number of training patterns is considerably smaller than the number of features, and therefore the covariance matrix is singular....
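A minimal numpy illustration of the rank deficiency described above: with fewer training patterns than features, the sample covariance matrix cannot have full rank, so its inverse does not exist. Shrinking toward the identity is one common workaround; the constant alpha below is an arbitrary illustrative choice.

    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 20, 50                       # fewer training patterns than features
    X = rng.standard_normal((n, p))

    S = np.cov(X, rowvar=False)         # p x p sample covariance matrix
    print(np.linalg.matrix_rank(S))     # at most n - 1 = 19 < p, so S is singular

    alpha = 0.1                         # arbitrary shrinkage weight
    S_reg = (1 - alpha) * S + alpha * np.eye(p)
    S_inv = np.linalg.inv(S_reg)        # now well defined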
Discriminant Analysis, a Powerful Classification Technique in Predictive Modeling
Discriminant analysis is one of the classical classification techniques used to discriminate a single categorical variable using multiple attributes. Discriminant analysis also assigns observations to one of the pre-defined groups based on knowledge of the multiple attributes. When the distribution within each group is multivariate normal, a parametric method can be used to develop a discrimin...
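As a concrete instance of such a parametric rule, the sketch below (our illustration; the function names are assumptions) fits the classical linear discriminant rule under a shared Gaussian covariance and assigns an observation to the group with the highest discriminant score.

    import numpy as np

    def lda_fit(X, y):
        # X: (n, p) array; y: length-n array of group labels
        classes = np.unique(y)
        means = {k: X[y == k].mean(axis=0) for k in classes}
        Xc = np.vstack([X[y == k] - means[k] for k in classes])
        S = Xc.T @ Xc / (len(X) - len(classes))   # pooled within-group covariance
        S_inv = np.linalg.inv(S)
        priors = {k: np.mean(y == k) for k in classes}
        return classes, means, S_inv, priors

    def lda_predict(x, classes, means, S_inv, priors):
        # Score_k(x) = x' S^-1 m_k - 0.5 m_k' S^-1 m_k + log prior_k
        scores = []
        for k in classes:
            m = means[k]
            scores.append(x @ S_inv @ m - 0.5 * m @ S_inv @ m + np.log(priors[k]))
        return classes[int(np.argmax(scores))]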
Face Recognition Based on Non-Negative Factorization and FLDA for Single Training Image per Person
Abstract: Dimensionality reduction is performed by both Principal Component Analysis (PCA) and Fisher Linear Discriminant Analysis (FLDA). A covariance matrix and eigenvector approach is followed in PCA. FLDA finds within-class and between-class scatter matrices. In some situations, the within-class scatter matrix may become singular. Normally, a singular matrix does not have an inverse. Two or more virtu...
An ℓ∞ Eigenvector Perturbation Bound and Its Application to Robust Covariance Estimation
In statistics and machine learning, people are often interested in the eigenvectors (or singular vectors) of certain matrices (e.g. covariance matrices, data matrices, etc.). However, those matrices are usually perturbed by noise or statistical errors, either from random sampling or structural patterns. One usually employs the Davis-Kahan sin θ theorem to bound the difference between the eigenvecto...
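A small numerical illustration of that use of the Davis-Kahan sin θ theorem (our sketch; the matrix size and noise scale are arbitrary choices): the angle between the leading eigenvectors of a symmetric matrix and its perturbed version is controlled by the perturbation's operator norm divided by the eigengap.

    import numpy as np

    rng = np.random.default_rng(1)
    p = 30
    # Symmetric matrix whose top eigenvalue is well separated from the rest
    eigvals = np.concatenate(([5.0], np.linspace(1.0, 2.0, p - 1)))
    A = np.diag(eigvals)
    E = 0.05 * rng.standard_normal((p, p))
    E = (E + E.T) / 2                         # symmetric perturbation ("noise")

    def top_pair(M):
        w, V = np.linalg.eigh(M)              # eigenvalues in ascending order
        return w, V[:, -1]                    # leading eigenvector

    w, v = top_pair(A)
    _, v_pert = top_pair(A + E)

    sin_theta = np.sqrt(1.0 - min(1.0, abs(v @ v_pert)) ** 2)
    gap = w[-1] - w[-2]                       # eigengap of the unperturbed matrix
    bound = 2.0 * np.linalg.norm(E, 2) / gap  # Davis-Kahan-type sin(theta) bound
    print(f"sin(theta) = {sin_theta:.4f} <= bound = {bound:.4f}")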
Comparison of statistical pattern-recognition algorithms for hybrid processing. II. Eigenvector-based algorithm
The pattern-recognition algorithms based on eigenvector analysis (group 2) are theoretically and experimentally compared. Group 2 consists of the Foley-Sammon (F-S) transform, Hotelling trace criterion (HTC), Fukunaga-Koontz (F-K) transform, linear discriminant function (LDF), and generalized matched filter (GMF) algorithms. It is shown that all eigenvector-based algorithms can be represented in a g...